
    Spectral Classification: Old and Contemporary

    Beginning with a historical account of spectral classification, its refinement through additional criteria is presented. The line strengths and ratios used in two-dimensional classification of each spectral class are described. A parallel classification scheme for metal-poor stars and the standards used for classification are presented. The extension of spectral classification beyond M to the L and T classes, and the spectroscopic criteria relevant to these classes, are described. Contemporary methods of classification based upon different automated approaches are introduced. Comment: To be published in "Principles and Perspectives in Cosmochemistry", Lecture Notes of the Kodai School on Synthesis of Elements in Stars, Eds. Aruna Goswami & Eswar Reddy, Springer Verlag, 2009; 17 pages, 10 figures.

    Testing the cognitive-behavioural maintenance models across DSM-5 bulimic-type eating disorder diagnostic groups: A multi-centre study

    The original cognitive-behavioural (CB) model of bulimia nervosa, which provided the basis for the widely used CB therapy, proposed that specific dysfunctional cognitions and behaviours maintain the disorder. However, amongst treatment completers, only 40–50% have a full and lasting response. The enhanced CB model (CB-E), upon which the enhanced version of the CB treatment was based, extended the original approach by including four additional maintenance factors. This study evaluated and compared both CB models in a large clinical treatment-seeking sample (N = 679), applying both DSM-IV and DSM-5 criteria for bulimic-type eating disorders. Application of the DSM-5 criteria reduced the number of cases of DSM-IV bulimic-type eating disorders not otherwise specified to 29.6%. Structural equation modelling analysis indicated that (a) although both models provided a good fit to the data, the CB-E model accounted for a greater proportion of variance in eating-disordered behaviours than the original one; (b) interpersonal problems, clinical perfectionism and low self-esteem were indirectly associated with dietary restraint through over-evaluation of shape and weight; (c) interpersonal problems and mood intolerance were directly linked to binge eating, whereas restraint only indirectly affected binge eating through mood intolerance, suggesting that factors other than restraint may play a more critical role in the maintenance of binge eating. In terms of the strength of the associations, no differences were observed across DSM-5 bulimic-type eating disorder diagnostic groups. The results are discussed with reference to theory and research, including neurobiological findings and recent hypotheses.

    A Comparison of Administrative and Physiologic Predictive Models in Determining Risk Adjusted Mortality Rates in Critically Ill Patients

    Hospitals are increasingly compared based on clinical outcomes adjusted for severity of illness. Multiple methods exist to adjust for differences between patients. The challenge for consumers of this information, both the public and healthcare providers, is interpreting differences in risk adjustment models, particularly when models differ in their use of administrative and physiologic data. We set out to examine how administrative and physiologic models compare to each other when applied to critically ill patients. We prospectively abstracted variables for a physiologic and an administrative model of mortality from two intensive care units in the United States. Predicted mortality was compared through the Pearson product-moment correlation coefficient and Bland-Altman analysis. A subgroup of patients admitted directly from the emergency department was analyzed to remove potential confounding changes in condition prior to ICU admission. We included 556 patients from two academic medical centers in this analysis. The administrative and physiologic models' predicted mortalities for the combined cohort were 15.3% (95% CI 13.7%, 16.8%) and 24.6% (95% CI 22.7%, 26.5%), respectively (t-test p-value < 0.001). The r(2) for these models was 0.297. The Bland-Altman plot suggests that at low predicted mortality there was good agreement; however, as mortality increased the models diverged. Similar results were found when analyzing a subgroup of patients admitted directly from the emergency department. When comparing the two hospitals, there was a statistical difference when using the administrative model but not the physiologic model.
    Unexplained mortality, defined as those patients who died who had a predicted mortality less than 10%, was a rare event by either model. In conclusion, while it has been shown that administrative models provide estimates of mortality that are similar to physiologic models in non-critically ill patients with pneumonia, our results suggest this finding cannot be applied globally to patients admitted to intensive care units. As patients and providers increasingly use publicly reported information in making health care decisions and referrals, it is critical that the provided information be understood. Our results suggest that severity of illness may influence the mortality index in administrative models. We suggest that when interpreting "report cards" or metrics, health care providers determine how the risk adjustment was made and how it compares to other risk adjustment models.
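    The Bland-Altman comparison described above can be sketched as follows. This is a minimal illustration on synthetic predicted-mortality data, not the study's actual analysis; the two toy "models" and all variable names are assumptions made up for the example:

    ```python
    import numpy as np

    def bland_altman(pred_a, pred_b):
        """Bland-Altman statistics for two sets of paired predictions.

        Returns per-pair means, differences, the mean difference (bias),
        and the 95% limits of agreement (bias +/- 1.96 * SD of differences).
        """
        pred_a = np.asarray(pred_a, dtype=float)
        pred_b = np.asarray(pred_b, dtype=float)
        means = (pred_a + pred_b) / 2.0
        diffs = pred_a - pred_b
        bias = diffs.mean()
        sd = diffs.std(ddof=1)
        return means, diffs, bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    # Synthetic example: two models that agree at low risk and diverge at high risk
    rng = np.random.default_rng(0)
    physio = rng.uniform(0.01, 0.9, size=500)                      # "physiologic" predictions
    admin = physio - 0.3 * physio**2 + rng.normal(0, 0.02, 500)    # "administrative", diverging
    means, diffs, bias, (lo, hi) = bland_altman(admin, physio)
    r = np.corrcoef(admin, physio)[0, 1]
    print(f"bias={bias:.3f}, limits of agreement=({lo:.3f}, {hi:.3f}), r^2={r**2:.3f}")
    ```

    Plotting `diffs` against `means` with horizontal lines at the bias and the two limits of agreement reproduces the standard Bland-Altman figure, which is how the divergence at high predicted mortality would show up visually.
    
    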

    The Pioneer Anomaly

    Radio-metric Doppler tracking data received from the Pioneer 10 and 11 spacecraft from heliocentric distances of 20-70 AU has consistently indicated the presence of a small, anomalous, blue-shifted frequency drift uniformly changing at a rate of ~6 x 10^{-9} Hz/s. Ultimately, the drift was interpreted as a constant sunward deceleration of each particular spacecraft at the level of a_P = (8.74 +/- 1.33) x 10^{-10} m/s^2. This apparent violation of Newton's gravitational inverse-square law has become known as the Pioneer anomaly; the nature of this anomaly remains unexplained. In this review, we summarize the current knowledge of the physical properties of the anomaly and the conditions that led to its detection and characterization. We review various mechanisms proposed to explain the anomaly and discuss the current state of efforts to determine its nature. A comprehensive new investigation of the anomalous behavior of the two Pioneers has begun recently. The new efforts rely on the much-extended set of radio-metric Doppler data for both spacecraft in conjunction with the newly available complete record of their telemetry files and a large archive of original project documentation. As the new study is yet to report its findings, this review provides the necessary background for the new results to appear in the near future. In particular, we provide a significant amount of information on the design, operations and behavior of the two Pioneers during their entire missions, including descriptions of various data formats and techniques used for their navigation and radio-science data analysis. As most of this information was recovered relatively recently, it was not used in the previous studies of the Pioneer anomaly, but it is critical for the new investigation. Comment: 165 pages, 40 figures, 16 tables; accepted for publication in Living Reviews in Relativity.
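    As a rough order-of-magnitude consistency check (my own back-of-the-envelope arithmetic, not from the review), the quoted frequency drift and acceleration can be related through a one-way Doppler formula a ≈ c·(dν/dt)/ν; the S-band downlink frequency near 2.29 GHz is an assumption here, and the actual analysis uses more careful multi-link conventions:

    ```python
    # Back-of-the-envelope check that the quoted ~6e-9 Hz/s drift and
    # a_P = 8.74e-10 m/s^2 are the same order of magnitude.
    # ASSUMPTIONS: one-way Doppler relation a = c * (df/dt) / f, and an
    # S-band downlink near 2.29 GHz; see the review for exact conventions.
    c = 299_792_458.0      # speed of light, m/s
    f_downlink = 2.29e9    # assumed S-band downlink frequency, Hz
    drift = 6e-9           # quoted frequency drift, Hz/s

    a_inferred = c * drift / f_downlink
    print(f"inferred deceleration ~ {a_inferred:.2e} m/s^2")  # ~7.9e-10, same order as a_P
    ```
    
    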

    A framework for the probabilistic analysis of meteotsunamis

    This paper is not subject to U.S. copyright. The definitive version was published in Natural Hazards 74 (2014): 123-142, doi:10.1007/s11069-014-1294-1. A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme yields a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
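    The Monte Carlo aggregation step can be sketched as follows. This is a toy illustration under assumed parameter distributions; the `synthetic_amplitude` function is a made-up stand-in for the numerical hydrodynamic model, and the rate and distribution parameters are not the paper's fitted values:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Assumed inputs (illustrative only, not the paper's fitted values):
    rate_per_year = 5.0    # Poisson rate of squall-line events
    years = 10_000         # length of one synthetic catalog

    def synthetic_amplitude(n):
        """Stand-in for the hydrodynamic model: draw a maximum coastal wave
        amplitude (m) per event from assumed source-parameter distributions."""
        pressure_jump = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # hPa, assumed
        speed_factor = rng.uniform(0.5, 1.5, size=n)                # resonance proxy, assumed
        return 0.1 * pressure_jump * speed_factor

    # Build one synthetic catalog: event count, then an amplitude per event
    n_events = rng.poisson(rate_per_year * years)
    amps = synthetic_amplitude(n_events)

    # Hazard curve: annualized rate of exceedance vs. amplitude threshold
    thresholds = np.linspace(0.05, 1.0, 20)
    exceed_rate = np.array([(amps > a).sum() / years for a in thresholds])

    for a, r in zip(thresholds[::5], exceed_rate[::5]):
        print(f"amplitude > {a:.2f} m: {r:.4f} events/yr")
    ```

    Repeating the catalog generation many times with parameters resampled from their parent distributions, as the paper describes, would give a family of such curves from which mean and quantile hazard curves can be read off.
    
    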

    Performance of CMS muon reconstruction in pp collision events at sqrt(s) = 7 TeV

    The performance of muon reconstruction, identification, and triggering in CMS has been studied using 40 inverse picobarns of data collected in pp collisions at sqrt(s) = 7 TeV at the LHC in 2010. A few benchmark sets of selection criteria covering a wide range of physics analysis needs have been examined. For all considered selections, the efficiency to reconstruct and identify a muon with a transverse momentum pT larger than a few GeV is above 95% over the whole region of pseudorapidity covered by the CMS muon system, abs(eta) < 2.4, while the probability to misidentify a hadron as a muon is well below 1%. The efficiency to trigger on single muons with pT above a few GeV is higher than 90% over the full eta range, and typically substantially better. The overall momentum scale is measured to a precision of 0.2% with muons from Z decays. The transverse momentum resolution varies from 1% to 6% depending on pseudorapidity for muons with pT below 100 GeV and, using cosmic rays, it is shown to be better than 10% in the central region up to pT = 1 TeV. Observed distributions of all quantities are well reproduced by the Monte Carlo simulation. Comment: Replaced with published version. Added journal reference and DOI.
